In this project we will try to predict the weekly closing price of Corn commodity futures. To do this, we will build a dataset that combines weekly Corn futures closing prices with the Long and Short Open Interest of Producers/Merchants/Processors/Users (sometimes called Commercials) from the Commitment of Traders (COT) reports, and use it to predict next week's price.
Two data sources are used; both have been downloaded and stored in the data folder:

- Historical futures prices: Corn Futures, Continuous Contract #1. Non-adjusted price based on spot-month continuous contract calculations; raw data from CME.
- Commitment of Traders - CORN (CBT) - Futures Only (002602), from the CFTC.
import warnings
warnings.filterwarnings('ignore')
import pandas as pd
import numpy as np
from IPython.core.display import display, HTML
pd.options.display.max_colwidth = 500  # otherwise pandas truncates long HTML strings to 50 characters
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
pd.options.mode.chained_assignment = None # default='warn'
from matplotlib import pyplot
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from math import sqrt
from numpy import concatenate
from sklearn.metrics import mean_squared_error
import matplotlib.pyplot as plt
from plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot
import cufflinks as cf
import plotly.tools as tls
init_notebook_mode(connected=True)
cf.go_offline()
Using TensorFlow backend.
df_fut_orig = pd.read_csv('data/CHRIS-CME_C1.csv')
df_fut_orig.head(n=5)
| | Date | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 2018-07-10 | 344.25 | 344.75 | 336.25 | 339.50 | 6.00 | 339.75 | 2668.0 | 2186.0 |
| 1 | 2018-07-09 | 346.00 | 348.50 | 342.50 | 346.00 | 6.00 | 345.75 | 3190.0 | 2969.0 |
| 2 | 2018-07-06 | 342.00 | 352.25 | 342.00 | 350.75 | 8.25 | 351.75 | 3068.0 | 3959.0 |
| 3 | 2018-07-05 | 345.50 | 348.75 | 341.50 | 342.50 | 0.75 | 343.50 | 3302.0 | 4812.0 |
| 4 | 2018-07-03 | 340.25 | 345.25 | 339.25 | 343.25 | 5.25 | 342.75 | 3048.0 | 5687.0 |
# Display a description of the dataset
display(df_fut_orig.describe())
| | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|
| count | 3033.000000 | 3034.000000 | 3034.000000 | 3034.000000 | 1081.000000 | 3034.000000 | 3034.000000 | 3034.00000 |
| mean | 457.095038 | 462.322924 | 451.795485 | 456.920040 | 3.950324 | 456.979318 | 103905.200396 | 352140.90145 |
| std | 140.338892 | 142.056030 | 138.436196 | 140.243019 | 3.415126 | 140.204571 | 73993.219920 | 248565.85531 |
| min | 219.000000 | 220.750000 | 216.750000 | 219.000000 | 0.000000 | 219.000000 | 0.000000 | 107.00000 |
| 25% | 360.000000 | 363.000000 | 356.250000 | 359.500000 | 1.500000 | 359.750000 | 40172.750000 | 107559.25000 |
| 50% | 388.500000 | 392.000000 | 383.500000 | 388.750000 | 3.000000 | 389.000000 | 102567.000000 | 365073.00000 |
| 75% | 565.500000 | 573.562500 | 557.375000 | 564.625000 | 5.500000 | 564.625000 | 152391.250000 | 556408.50000 |
| max | 830.250000 | 843.750000 | 822.750000 | 831.250000 | 30.750000 | 831.250000 | 538170.000000 | 858696.00000 |
df_fut_orig['Date'] = pd.to_datetime(df_fut_orig['Date'])
df_fut_orig.set_index('Date',inplace=True)
df_fut_orig = df_fut_orig.sort_values('Date')
Plot Corn Futures Price Series using Plotly
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_original_price_series(df_fut_orig)
It seems there are some rows where Volume = 0; let's find out more about them.
df_fut_orig[df_fut_orig['Volume']<1]
| | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|
| Date | ||||||||
| 2007-04-05 | 359.75 | 367.50 | 357.25 | 366.00 | NaN | 366.00 | 0.0 | 354349.0 |
| 2012-04-06 | 658.25 | 658.25 | 658.25 | 658.25 | NaN | 658.25 | 0.0 | 401521.0 |
| 2015-04-03 | 386.50 | 386.50 | 386.50 | 386.50 | NaN | 386.50 | 0.0 | 470964.0 |
Since we will resample the daily prices into weekly prices, let's drop these zero-volume rows.
# drop the zero-volume rows
df_fut_orig.drop(df_fut_orig[df_fut_orig.Volume < 1].index, inplace=True)
df_cot_orig = pd.read_csv('data/CFTC-002602_F_ALL.csv')
display(df_cot_orig.head())
| | Date | Open_Interest | Producer_Merchant_Processor_User_Longs | Producer_Merchant_Processor_User_Shorts | Swap Dealer Longs | Swap Dealer Shorts | Swap Dealer Spreads | Money Manager Longs | Money Manager Shorts | Money Manager Spreads | Other Reportable Longs | Other Reportable Shorts | Other Reportable Spreads | Total Reportable Longs | Total Reportable Shorts | Non Reportable Longs | Non Reportable Shorts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2018-07-10 | 1818055.0 | 500172.0 | 750062.0 | 208128.0 | 39513.0 | 99477.0 | 263353.0 | 404297.0 | 154286.0 | 320946.0 | 70682.0 | 98709.0 | 1645071.0 | 1617026.0 | 172984.0 | 201029.0 |
| 1 | 2018-07-03 | 1830330.0 | 484257.0 | 773851.0 | 210341.0 | 36927.0 | 100340.0 | 274795.0 | 382191.0 | 149756.0 | 322256.0 | 66508.0 | 119627.0 | 1661372.0 | 1629200.0 | 168958.0 | 201130.0 |
| 2 | 2018-06-26 | 1885804.0 | 513100.0 | 840177.0 | 223131.0 | 32763.0 | 91972.0 | 287061.0 | 377825.0 | 153461.0 | 330396.0 | 58283.0 | 116745.0 | 1715866.0 | 1671226.0 | 169938.0 | 214578.0 |
| 3 | 2018-06-19 | 1992169.0 | 525197.0 | 920764.0 | 222105.0 | 41144.0 | 99285.0 | 299377.0 | 356828.0 | 163454.0 | 379025.0 | 56652.0 | 135078.0 | 1823521.0 | 1773205.0 | 168648.0 | 218964.0 |
| 4 | 2018-06-12 | 1963233.0 | 488666.0 | 917204.0 | 235249.0 | 37674.0 | 93281.0 | 292054.0 | 304292.0 | 172623.0 | 363918.0 | 65030.0 | 147098.0 | 1792889.0 | 1737202.0 | 170344.0 | 226031.0 |
display(df_cot_orig.describe())
| | Open_Interest | Producer_Merchant_Processor_User_Longs | Producer_Merchant_Processor_User_Shorts | Swap Dealer Longs | Swap Dealer Shorts | Swap Dealer Spreads | Money Manager Longs | Money Manager Shorts | Money Manager Spreads | Other Reportable Longs | Other Reportable Shorts | Other Reportable Spreads | Total Reportable Longs | Total Reportable Shorts | Non Reportable Longs | Non Reportable Shorts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 6.310000e+02 | 631.000000 | 6.310000e+02 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 6.310000e+02 | 6.310000e+02 | 631.000000 | 631.000000 |
| mean | 1.292201e+06 | 270795.049128 | 6.268425e+05 | 290792.497623 | 20337.034865 | 33260.068146 | 236884.269414 | 137472.426307 | 94546.356577 | 140931.890650 | 70914.334390 | 85505.109350 | 1.152715e+06 | 1.068878e+06 | 139485.541997 | 223322.976228 |
| std | 2.095471e+05 | 68976.221600 | 1.554272e+05 | 53203.484072 | 18944.008732 | 22912.567257 | 67454.195123 | 109465.025186 | 32739.133163 | 51939.690903 | 26360.863384 | 29682.425476 | 1.939790e+05 | 2.060080e+05 | 23718.957966 | 29824.710288 |
| min | 7.482520e+05 | 102373.000000 | 2.972960e+05 | 186981.000000 | 0.000000 | 4397.000000 | 96989.000000 | 6714.000000 | 29130.000000 | 49809.000000 | 25905.000000 | 27592.000000 | 6.379810e+05 | 5.689510e+05 | 78578.000000 | 156086.000000 |
| 25% | 1.192226e+06 | 226595.000000 | 5.235930e+05 | 255196.500000 | 6524.000000 | 13978.000000 | 186366.500000 | 47947.000000 | 72018.500000 | 104764.000000 | 53331.000000 | 62690.000000 | 1.055362e+06 | 9.573815e+05 | 121829.500000 | 198860.500000 |
| 50% | 1.301506e+06 | 262823.000000 | 6.112810e+05 | 276337.000000 | 15239.000000 | 27209.000000 | 225682.000000 | 95548.000000 | 91850.000000 | 140343.000000 | 66261.000000 | 82705.000000 | 1.166372e+06 | 1.067548e+06 | 136966.000000 | 227337.000000 |
| 75% | 1.398275e+06 | 314224.000000 | 7.058555e+05 | 321265.500000 | 28178.000000 | 48009.500000 | 287331.000000 | 211154.000000 | 113803.000000 | 175846.000000 | 83448.500000 | 106077.500000 | 1.247976e+06 | 1.180280e+06 | 153542.500000 | 246903.000000 |
| max | 1.992169e+06 | 525197.000000 | 1.001517e+06 | 422803.000000 | 95591.000000 | 113775.000000 | 431569.000000 | 447470.000000 | 231064.000000 | 379025.000000 | 173322.000000 | 181385.000000 | 1.825238e+06 | 1.773205e+06 | 206821.000000 | 293948.000000 |
# keep only Settle, Volume and Previous_Day_Open_Interest
df_fut = df_fut_orig.iloc[:, [5, 6, 7]].copy()
display(df_fut.head())
| | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|
| Date | |||
| 2006-06-16 | 235.50 | 56486.0 | 203491.0 |
| 2006-06-19 | 229.75 | 51299.0 | 190044.0 |
| 2006-06-20 | 229.75 | 41605.0 | 175859.0 |
| 2006-06-21 | 232.75 | 29803.0 | 162348.0 |
| 2006-06-22 | 230.50 | 28687.0 | 147658.0 |
s_settle = df_fut['Settle'].resample('W').last()
s_volume = df_fut['Volume'].resample('W').last()
df_fut_weekly = pd.concat([s_settle,s_volume], axis=1)
display(df_fut_weekly.head())
| | Settle | Volume |
|---|---|---|
| Date | ||
| 2006-06-18 | 235.50 | 56486.0 |
| 2006-06-25 | 228.25 | 28361.0 |
| 2006-07-02 | 235.50 | 30519.0 |
| 2006-07-09 | 241.00 | 13057.0 |
| 2006-07-16 | 253.50 | 2460.0 |
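As a quick toy illustration (not the project data) of what `.resample('W').last()` does: pandas' weekly frequency `'W'` is an alias for `W-SUN`, so each bin is labeled with its Sunday end date and `.last()` picks the final observation inside that week — which is why the daily prices above end up stamped on Sundays.

```python
import pandas as pd

# one business week of made-up daily settles, Mon 2006-06-12 .. Fri 2006-06-16
idx = pd.date_range('2006-06-12', periods=5, freq='B')
daily = pd.Series([230.0, 231.5, 229.0, 232.0, 235.5], index=idx)

# 'W' (= W-SUN) labels the bin with its Sunday; .last() keeps Friday's value
weekly = daily.resample('W').last()
print(weekly)  # single row labeled 2006-06-18 with value 235.5
```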
# keep only Date, Open_Interest and the Producer/Merchant/Processor/User positions
df_cot = df_cot_orig.iloc[:, [0, 1, 2, 3]].copy()
df_cot.rename(index=str, columns={"Producer_Merchant_Processor_User_Longs": "Longs", \
"Producer_Merchant_Processor_User_Shorts": "Shorts"},inplace=True)
df_cot['Date'] = pd.to_datetime(df_cot['Date'])
df_cot.set_index('Date',inplace=True)
display(df_cot.head())
| | Open_Interest | Longs | Shorts |
|---|---|---|---|
| Date | |||
| 2018-07-10 | 1818055.0 | 500172.0 | 750062.0 |
| 2018-07-03 | 1830330.0 | 484257.0 | 773851.0 |
| 2018-06-26 | 1885804.0 | 513100.0 | 840177.0 |
| 2018-06-19 | 1992169.0 | 525197.0 | 920764.0 |
| 2018-06-12 | 1963233.0 | 488666.0 | 917204.0 |
s_longs = df_cot['Longs'].resample('W').last()
s_shorts = df_cot['Shorts'].resample('W').last()
s_open_interest = df_cot['Open_Interest'].resample('W').last()
df_cot_weekly = pd.concat([s_open_interest,s_longs, s_shorts], axis=1)
display(df_cot_weekly.head(5))
| | Open_Interest | Longs | Shorts |
|---|---|---|---|
| Date | |||
| 2006-06-18 | 1320155.0 | 209662.0 | 699163.0 |
| 2006-06-25 | 1321520.0 | 224476.0 | 666688.0 |
| 2006-07-02 | 1329400.0 | 234769.0 | 645735.0 |
| 2006-07-09 | 1327482.0 | 220552.0 | 648405.0 |
| 2006-07-16 | 1333225.0 | 216968.0 | 673110.0 |
df_weekly = pd.merge(df_fut_weekly, df_cot_weekly, on='Date')
display(df_weekly.head(5))
| | Settle | Volume | Open_Interest | Longs | Shorts |
|---|---|---|---|---|---|
| Date | |||||
| 2006-06-18 | 235.50 | 56486.0 | 1320155.0 | 209662.0 | 699163.0 |
| 2006-06-25 | 228.25 | 28361.0 | 1321520.0 | 224476.0 | 666688.0 |
| 2006-07-02 | 235.50 | 30519.0 | 1329400.0 | 234769.0 | 645735.0 |
| 2006-07-09 | 241.00 | 13057.0 | 1327482.0 | 220552.0 | 648405.0 |
| 2006-07-16 | 253.50 | 2460.0 | 1333225.0 | 216968.0 | 673110.0 |
# Display a description of the dataset
display(df_weekly.describe())
| | Settle | Volume | Open_Interest | Longs | Shorts |
|---|---|---|---|---|---|
| count | 631.000000 | 631.000000 | 6.310000e+02 | 631.000000 | 6.310000e+02 |
| mean | 456.978605 | 100835.204437 | 1.292201e+06 | 270795.049128 | 6.268425e+05 |
| std | 140.242112 | 72466.341538 | 2.095471e+05 | 68976.221600 | 1.554272e+05 |
| min | 219.750000 | 132.000000 | 7.482520e+05 | 102373.000000 | 2.972960e+05 |
| 25% | 359.500000 | 34822.500000 | 1.192226e+06 | 226595.000000 | 5.235930e+05 |
| 50% | 389.250000 | 101209.000000 | 1.301506e+06 | 262823.000000 | 6.112810e+05 |
| 75% | 560.375000 | 150341.000000 | 1.398275e+06 | 314224.000000 | 7.058555e+05 |
| max | 824.500000 | 369522.000000 | 1.992169e+06 | 525197.000000 | 1.001517e+06 |
# reset the index since we need row numbers for splitting;
# keep a date-indexed copy for the plots grouped by year
df_weekly_idx_date = df_weekly.copy()
df_weekly.reset_index(inplace=True)
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_weekly_combined_series_by_date(df_weekly)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_weekly_combined_series_by_trading_week(df_weekly)
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_grouped_by_year_data(df_weekly_idx_date,"Stacked Plots of Price by Year")
%load_ext autoreload
%autoreload 2
import visuals
visuals.lag_plot(df_weekly,"Lag Plot")
scaler = MinMaxScaler(feature_range=(0, 1))
values = df_weekly.loc[:, df_weekly.columns != 'Date'].values
scaled = scaler.fit_transform(values)
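`MinMaxScaler` rescales each column independently to the [0, 1] range using that column's own min and max, and the same fitted scaler must be kept around so model outputs can later be mapped back to price units with `inverse_transform`. A small sketch with illustrative values:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# two illustrative columns with very different ranges (e.g. Settle vs Volume)
X = np.array([[219.75,    132.0],
              [824.50, 369522.0],
              [456.98, 100835.0]])

scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(X)             # each column scaled by its own min/max
restored = scaler.inverse_transform(scaled)  # maps back to the original units
print(scaled.min(axis=0), scaled.max(axis=0))
```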
validation_start=df_weekly[df_weekly['Date'] >= pd.to_datetime('2017-01-01')].index[0]
testing_start=df_weekly[df_weekly['Date'] >= pd.to_datetime('2018-01-01')].index[0]
print("validation start",validation_start)
print("testing start",testing_start)
validation start 550
testing start 603
# print data to double check
#print(df_weekly.iloc[validation_start])
#print(df_weekly.iloc[testing_start])
%load_ext autoreload
%autoreload 2
import data_preparer
reframed = data_preparer.series_to_supervised(scaled, 1, 1)
# drop columns we don't want to predict
reframed.drop(reframed.columns[[6,7,8,9]], axis=1, inplace=True)
display(reframed.head())
| | var1(t-1) | var2(t-1) | var3(t-1) | var4(t-1) | var5(t-1) | var1(t) |
|---|---|---|---|---|---|---|
| 1 | 0.026044 | 0.152560 | 0.459760 | 0.253744 | 0.570655 | 0.014055 |
| 2 | 0.014055 | 0.076421 | 0.460857 | 0.288780 | 0.524540 | 0.026044 |
| 3 | 0.026044 | 0.082263 | 0.467192 | 0.313123 | 0.494786 | 0.035138 |
| 4 | 0.035138 | 0.034990 | 0.465650 | 0.279499 | 0.498578 | 0.055808 |
| 5 | 0.055808 | 0.006302 | 0.470267 | 0.271023 | 0.533659 | 0.028938 |
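`series_to_supervised` lives in the project's own `data_preparer` module; the widely used shift-based pattern it follows looks roughly like this (a sketch — the project's version may differ in details), producing the `var{j}(t-1)` / `var{j}(t)` column names seen above:

```python
import numpy as np
import pandas as pd

def series_to_supervised(data, n_in=1, n_out=1, dropnan=True):
    """Frame a multivariate series as supervised learning via column shifts.
    Sketch of the common pattern assumed to match data_preparer's version."""
    df = pd.DataFrame(data)
    n_vars = df.shape[1]
    cols, names = [], []
    for i in range(n_in, 0, -1):              # lagged inputs (t-n_in ... t-1)
        cols.append(df.shift(i))
        names += [f'var{j+1}(t-{i})' for j in range(n_vars)]
    for i in range(n_out):                    # outputs (t ... t+n_out-1)
        cols.append(df.shift(-i))
        names += [f'var{j+1}(t)' if i == 0 else f'var{j+1}(t+{i})'
                  for j in range(n_vars)]
    agg = pd.concat(cols, axis=1)
    agg.columns = names
    if dropnan:                               # drop rows made incomplete by shifting
        agg.dropna(inplace=True)
    return agg

demo = series_to_supervised(np.arange(8).reshape(4, 2), 1, 1)
print(demo.columns.tolist())  # ['var1(t-1)', 'var2(t-1)', 'var1(t)', 'var2(t)']
```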
%load_ext autoreload
%autoreload 2
import data_preparer
train_X, train_y, validation_X, validation_y, test_X, test_y = data_preparer.split_data(reframed, validation_start, testing_start)
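The shape of `split_data` can be sketched as follows: split the supervised frame by the row positions computed earlier and reshape the inputs into the 3-D `[samples, timesteps, features]` layout that Keras LSTMs expect. This is an assumed reconstruction of `data_preparer.split_data`, not its exact code:

```python
import numpy as np

def split_data(reframed, validation_start, testing_start):
    """Split into train/validation/test by row position and reshape X to 3-D."""
    values = np.asarray(reframed, dtype='float32')
    train = values[:validation_start, :]
    validation = values[validation_start:testing_start, :]
    test = values[testing_start:, :]

    def xy(block):
        X, y = block[:, :-1], block[:, -1]       # last column is the target var1(t)
        # reshape to [samples, timesteps, features] with one timestep per sample
        return X.reshape((X.shape[0], 1, X.shape[1])), y

    train_X, train_y = xy(train)
    validation_X, validation_y = xy(validation)
    test_X, test_y = xy(test)
    return train_X, train_y, validation_X, validation_y, test_X, test_y

demo = split_data(np.arange(60.).reshape(10, 6), 6, 8)
print([a.shape for a in demo])
```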
%load_ext autoreload
%autoreload 2
import models
model,history=models.basic_lstm_model(train_X,train_y,validation_X,validation_y)
Train on 550 samples, validate on 53 samples
Epoch 1/500 - 2s - loss: 0.4313 - val_loss: 0.2146
Epoch 2/500 - 0s - loss: 0.4074 - val_loss: 0.1917
...
Epoch 50/500 - 0s - loss: 0.1768 - val_loss: 0.0992
...
Epoch 100/500 - 0s - loss: 0.1524 - val_loss: 0.1052
...
Epoch 150/500 - 0s - loss: 0.1146 - val_loss: 0.1017
...
Epoch 200/500 - 0s - loss: 0.0534 - val_loss: 0.0390
...
Epoch 250/500 - 0s - loss: 0.0285 - val_loss: 0.0111
...
Epoch 300/500 - 0s - loss: 0.0275 - val_loss: 0.0110
...
Epoch 350/500 - 0s - loss: 0.0271 - val_loss: 0.0109
...
Epoch 376/500 - 0s - loss: 0.0269 - val_loss: 0.0110
(training log truncated)
loss: 0.0269 - val_loss: 0.0111 Epoch 378/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 379/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 380/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 381/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 382/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 383/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 384/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 385/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 386/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 387/500 - 0s - loss: 0.0269 - val_loss: 0.0111 Epoch 388/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 389/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 390/500 - 0s - loss: 0.0269 - val_loss: 0.0112 Epoch 391/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 392/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 393/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 394/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 395/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 396/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 397/500 - 0s - loss: 0.0269 - val_loss: 0.0110 Epoch 398/500 - 0s - loss: 0.0268 - val_loss: 0.0109 Epoch 399/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 400/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 401/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 402/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 403/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 404/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 405/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 406/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 407/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 408/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 409/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 410/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 411/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 412/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 413/500 - 0s - loss: 0.0268 - val_loss: 0.0111 Epoch 414/500 - 0s - loss: 0.0268 - val_loss: 0.0110 Epoch 
415/500 - 0s - loss: 0.0267 - val_loss: 0.0109 Epoch 416/500 - 0s - loss: 0.0267 - val_loss: 0.0111 Epoch 417/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 418/500 - 0s - loss: 0.0267 - val_loss: 0.0111 Epoch 419/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 420/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 421/500 - 0s - loss: 0.0267 - val_loss: 0.0111 Epoch 422/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 423/500 - 0s - loss: 0.0267 - val_loss: 0.0109 Epoch 424/500 - 0s - loss: 0.0267 - val_loss: 0.0111 Epoch 425/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 426/500 - 0s - loss: 0.0267 - val_loss: 0.0109 Epoch 427/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 428/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 429/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 430/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 431/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 432/500 - 0s - loss: 0.0267 - val_loss: 0.0109 Epoch 433/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 434/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 435/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 436/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 437/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 438/500 - 0s - loss: 0.0267 - val_loss: 0.0109 Epoch 439/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 440/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 441/500 - 0s - loss: 0.0267 - val_loss: 0.0110 Epoch 442/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 443/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 444/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 445/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 446/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 447/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 448/500 - 0s - loss: 0.0266 - val_loss: 0.0109 Epoch 449/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 450/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 451/500 - 0s - loss: 0.0266 - val_loss: 0.0109 Epoch 452/500 - 0s - loss: 0.0266 - 
val_loss: 0.0112 Epoch 453/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 454/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 455/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 456/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 457/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 458/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 459/500 - 0s - loss: 0.0266 - val_loss: 0.0109 Epoch 460/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 461/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 462/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 463/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 464/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 465/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 466/500 - 0s - loss: 0.0266 - val_loss: 0.0111 Epoch 467/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 468/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 469/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 470/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 471/500 - 0s - loss: 0.0266 - val_loss: 0.0110 Epoch 472/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 473/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 474/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 475/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 476/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 477/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 478/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 479/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 480/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 481/500 - 0s - loss: 0.0265 - val_loss: 0.0109 Epoch 482/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 483/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 484/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 485/500 - 0s - loss: 0.0265 - val_loss: 0.0109 Epoch 486/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 487/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 488/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 489/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 490/500 - 0s - 
loss: 0.0265 - val_loss: 0.0110 Epoch 491/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 492/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 493/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 494/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 495/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 496/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 497/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 498/500 - 0s - loss: 0.0265 - val_loss: 0.0111 Epoch 499/500 - 0s - loss: 0.0265 - val_loss: 0.0110 Epoch 500/500 - 0s - loss: 0.0265 - val_loss: 0.0110
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(validation_X,validation_y,model,scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload
LSTM Model on Validation Data RMSE: 8.663
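The helper `models.make_lstm_prediction` lives in models.py, which is not shown here. Presumably it inverts the MinMax scaling so the RMSE is reported in price units rather than in the [0, 1] scaled space; a minimal sketch of that idea, with hypothetical names:

```python
# Hypothetical sketch of what models.make_lstm_prediction may do after the
# network predicts: undo the MinMax scaling on both prediction and target,
# then score the forecast with RMSE in original price units.
import numpy as np
from math import sqrt
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import MinMaxScaler

def inverse_and_score(yhat_scaled, y_scaled, scaler):
    """Map scaled series back to price units and report RMSE."""
    inv_yhat = scaler.inverse_transform(yhat_scaled.reshape(-1, 1)).ravel()
    inv_y = scaler.inverse_transform(y_scaled.reshape(-1, 1)).ravel()
    rmse = sqrt(mean_squared_error(inv_y, inv_yhat))
    return inv_yhat, inv_y, rmse

# Toy usage with a scaler fitted on a price-like range
scaler = MinMaxScaler()
prices = np.array([[200.0], [250.0], [300.0]])
scaler.fit(prices)
y = scaler.transform(prices).ravel()
yhat = y + 0.01  # pretend predictions are slightly off in scaled space
_, _, rmse = inverse_and_score(yhat, y, scaler)
```

Reporting the error in price units is what makes the RMSE of 8.663 directly comparable to the benchmark model below.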
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_series_to_compare(inv_y,inv_yhat,"Actual Price","Predicted Price", "Actual Price Versus LSTM Predicted Price")
In this section we evaluate our benchmark model. As stated in the project proposal, the benchmark is a simple linear regression model that predicts next week's settle price from the current week's price.
from pandas import read_csv
from pandas import DataFrame
from pandas import concat
from matplotlib import pyplot
from sklearn.metrics import mean_squared_error
from math import sqrt
# Create lagged dataset
values = pd.DataFrame(df_weekly["Settle"].values)
df_benchmark = concat([values.shift(1), values], axis=1)
df_benchmark.columns = ['t', 't+1']
display(df_benchmark.head(5))
|   | t | t+1 |
|---|---|---|
| 0 | NaN | 235.50 |
| 1 | 235.50 | 228.25 |
| 2 | 228.25 | 235.50 |
| 3 | 235.50 | 241.00 |
| 4 | 241.00 | 253.50 |
# split into train , validation and test sets
X = df_benchmark.values
train, validation, test = X[1:validation_start], X[validation_start:testing_start],X[testing_start:]
train_bench_X, train_bench_y = train[:,0], train[:,1]
validation_bench_X, validation_bench_y = validation[:,0], validation[:,1]
test_bench_X, test_bench_y = test[:,0], test[:,1]
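`models.make_benchmark_model_prediction` is likewise defined in models.py and not shown. A minimal sketch of a lag-1 linear-regression benchmark, consistent with the `t`/`t+1` columns above (the function name and internals here are assumptions, and a full evaluation would fit on the training split rather than the split being scored):

```python
# Sketch of a lag-1 linear-regression benchmark: fit price(t+1) ~ price(t)
# and score the fit with RMSE in price units.
import numpy as np
from math import sqrt
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

def benchmark_prediction(bench_X, bench_y):
    """Regress next week's price on this week's price; return predictions and RMSE."""
    X = np.asarray(bench_X, dtype=float).reshape(-1, 1)
    y = np.asarray(bench_y, dtype=float)
    model = LinearRegression().fit(X, y)
    predictions = model.predict(X)
    rmse = sqrt(mean_squared_error(y, predictions))
    return predictions, rmse

# Toy usage: a perfectly linear relationship should give near-zero RMSE
X = np.array([235.5, 228.25, 235.5, 241.0, 253.5])
y = X + 2.0
preds, rmse = benchmark_prediction(X, y)
```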
%load_ext autoreload
%autoreload 2
import models
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(validation_bench_X,validation_bench_y)
print('Benchmark Model on Validation Data RMSE: %.3f' % rmse)
Benchmark Model on Validation Data RMSE: 8.750
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_series_to_compare(validation_bench_y,predictions,"Actual Price","Predicted Price", "Actual Price Versus Benchmark Model Predicted Price")
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(test_X,test_y,model,scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
LSTM Model on Test Data RMSE: 8.950
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(test_bench_X,test_bench_y)
print('Benchmark Model on Test Data RMSE: %.3f' % rmse)
Benchmark Model on Test Data RMSE: 8.293
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_memmory_cells(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.012129
>2/5 param=1.000000, loss=0.014510
>3/5 param=1.000000, loss=0.011251
>4/5 param=1.000000, loss=0.013165
>5/5 param=1.000000, loss=0.012103
>1/5 param=5.000000, loss=0.011613
>2/5 param=5.000000, loss=0.012066
>3/5 param=5.000000, loss=0.011987
>4/5 param=5.000000, loss=0.012024
>5/5 param=5.000000, loss=0.012577
>1/5 param=10.000000, loss=0.012330
>2/5 param=10.000000, loss=0.013115
>3/5 param=10.000000, loss=0.013052
>4/5 param=10.000000, loss=0.011792
>5/5 param=10.000000, loss=0.013219
>1/5 param=25.000000, loss=0.011451
>2/5 param=25.000000, loss=0.013046
>3/5 param=25.000000, loss=0.011217
>4/5 param=25.000000, loss=0.011381
>5/5 param=25.000000, loss=0.011058
>1/5 param=50.000000, loss=0.012644
>2/5 param=50.000000, loss=0.012646
>3/5 param=50.000000, loss=0.011140
>4/5 param=50.000000, loss=0.013345
>5/5 param=50.000000, loss=0.012515
>1/5 param=100.000000, loss=0.011604
>2/5 param=100.000000, loss=0.015141
>3/5 param=100.000000, loss=0.012493
>4/5 param=100.000000, loss=0.012222
>5/5 param=100.000000, loss=0.013316
>1/5 param=200.000000, loss=0.011432
>2/5 param=200.000000, loss=0.012943
>3/5 param=200.000000, loss=0.010766
>4/5 param=200.000000, loss=0.014985
>5/5 param=200.000000, loss=0.011893
1 5 10 25 50 100 200
count 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000
mean 0.012632 0.012053 0.012702 0.011631 0.012458 0.012955 0.012404
std 0.001250 0.000344 0.000618 0.000806 0.000805 0.001368 0.001646
min 0.011251 0.011613 0.011792 0.011058 0.011140 0.011604 0.010766
25% 0.012103 0.011987 0.012330 0.011217 0.012515 0.012222 0.011432
50% 0.012129 0.012024 0.013052 0.011381 0.012644 0.012493 0.011893
75% 0.013165 0.012066 0.013115 0.011451 0.012646 0.013316 0.012943
max 0.014510 0.012577 0.013219 0.013046 0.013345 0.015141 0.014985
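The tune_model helpers are not shown, but the output above suggests each candidate value is trained and evaluated five times and the resulting losses summarised with pandas `describe()`, since a single stochastic training run is an unreliable basis for comparing configurations. A minimal sketch of such a harness (all names here are assumptions):

```python
# Sketch of a repeated-evaluation tuning harness: run each parameter value
# n_repeats times, collect the losses, and summarise them so the comparison
# reflects both the mean and the run-to-run spread.
import pandas as pd

def tune_parameter(param_values, evaluate, n_repeats=5):
    """Call evaluate(param) n_repeats times per value; return a describe() table."""
    results = {}
    for param in param_values:
        losses = []
        for i in range(n_repeats):
            loss = evaluate(param)
            print('>%d/%d param=%f, loss=%f' % (i + 1, n_repeats, param, loss))
            losses.append(loss)
        results[param] = losses
    return pd.DataFrame(results).describe()

# Toy usage with a deterministic stand-in for model training
summary = tune_parameter([1, 5], lambda p: 0.01 + 0.001 * p)
```

On the table above this points to 25 memory cells: it has the lowest mean validation loss (0.011631) with a moderate spread.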
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_batch_size(train_X,train_y,validation_X,validation_y)
>1/5 param=2.000000, loss=0.017062
>2/5 param=2.000000, loss=0.017288
>3/5 param=2.000000, loss=0.019628
>4/5 param=2.000000, loss=0.017816
>5/5 param=2.000000, loss=0.019510
>1/5 param=4.000000, loss=0.012946
>2/5 param=4.000000, loss=0.013468
>3/5 param=4.000000, loss=0.012065
>4/5 param=4.000000, loss=0.012588
>5/5 param=4.000000, loss=0.013342
>1/5 param=8.000000, loss=0.014913
>2/5 param=8.000000, loss=0.016104
>3/5 param=8.000000, loss=0.015724
>4/5 param=8.000000, loss=0.015701
>5/5 param=8.000000, loss=0.014112
>1/5 param=32.000000, loss=0.011369
>2/5 param=32.000000, loss=0.011775
>3/5 param=32.000000, loss=0.012824
>4/5 param=32.000000, loss=0.012704
>5/5 param=32.000000, loss=0.011133
>1/5 param=64.000000, loss=0.011609
>2/5 param=64.000000, loss=0.011532
>3/5 param=64.000000, loss=0.013435
>4/5 param=64.000000, loss=0.011951
>5/5 param=64.000000, loss=0.012349
>1/5 param=128.000000, loss=0.011928
>2/5 param=128.000000, loss=0.012988
>3/5 param=128.000000, loss=0.011940
>4/5 param=128.000000, loss=0.011974
>5/5 param=128.000000, loss=0.011488
>1/5 param=256.000000, loss=0.011780
>2/5 param=256.000000, loss=0.013215
>3/5 param=256.000000, loss=0.012355
>4/5 param=256.000000, loss=0.011390
>5/5 param=256.000000, loss=0.011595
2 4 8 32 64 128 256
count 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000
mean 0.018261 0.012882 0.015311 0.011961 0.012175 0.012064 0.012067
std 0.001226 0.000573 0.000798 0.000769 0.000775 0.000554 0.000736
min 0.017062 0.012065 0.014112 0.011133 0.011532 0.011488 0.011390
25% 0.017288 0.012588 0.014913 0.011369 0.011609 0.011928 0.011595
50% 0.017816 0.012946 0.015701 0.011775 0.011951 0.011940 0.011780
75% 0.019510 0.013342 0.015724 0.012704 0.012349 0.011974 0.012355
max 0.019628 0.013468 0.016104 0.012824 0.013435 0.012988 0.013215
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_learning_rate(train_X,train_y,validation_X,validation_y)
>1/5 param=0.100000, loss=0.023888
>2/5 param=0.100000, loss=0.011819
>3/5 param=0.100000, loss=0.019747
>4/5 param=0.100000, loss=0.036420
>5/5 param=0.100000, loss=0.012148
>1/5 param=0.001000, loss=0.011653
>2/5 param=0.001000, loss=0.012345
>3/5 param=0.001000, loss=0.011941
>4/5 param=0.001000, loss=0.011418
>5/5 param=0.001000, loss=0.011669
>1/5 param=0.000100, loss=0.038567
>2/5 param=0.000100, loss=0.033610
>3/5 param=0.000100, loss=0.049779
>4/5 param=0.000100, loss=0.035971
>5/5 param=0.000100, loss=0.043945
0.1 0.001 0.0001
count 5.000000 5.000000 5.000000
mean 0.020805 0.011805 0.040375
std 0.010126 0.000354 0.006512
min 0.011819 0.011418 0.033610
25% 0.012148 0.011653 0.035971
50% 0.019747 0.011669 0.038567
75% 0.023888 0.011941 0.043945
max 0.036420 0.012345 0.049779
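The learning-rate table reads cleanly: 0.001 gives both the lowest mean loss and the smallest spread, 0.1 is erratic (individual runs range from 0.012 to 0.036), and 0.0001 is too small to converge within 500 epochs. The same pattern shows up in plain gradient descent on a toy quadratic, sketched here purely for illustration (this is not the notebook's model):

```python
def gd_final_loss(lr, steps=100):
    """Minimise f(w) = w**2 with fixed-step gradient descent from w = 1."""
    w = 1.0
    for _ in range(steps):
        w -= lr * 2.0 * w  # gradient of w**2 is 2w
    return w * w

# Too large a rate diverges, a mid-range rate converges,
# and too small a rate barely moves in the step budget.
losses = {lr: gd_final_loss(lr) for lr in (1.1, 0.01, 0.0001)}
```

With lr = 1.1 each update overshoots and |w| grows, mirroring the unstable 0.1 column; with lr = 0.0001 the loss barely decreases, mirroring the underfit 0.0001 column.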
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_weight_regularization(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.011683
>2/5 param=1.000000, loss=0.011673
>3/5 param=1.000000, loss=0.012200
>4/5 param=1.000000, loss=0.011844
>5/5 param=1.000000, loss=0.012050
>1/5 param=2.000000, loss=0.036623
>2/5 param=2.000000, loss=0.036176
>3/5 param=2.000000, loss=0.034430
>4/5 param=2.000000, loss=0.035556
>5/5 param=2.000000, loss=0.032201
>1/5 param=3.000000, loss=0.018865
>2/5 param=3.000000, loss=0.019171
>3/5 param=3.000000, loss=0.019481
>4/5 param=3.000000, loss=0.019422
>5/5 param=3.000000, loss=0.018394
>1/5 param=4.000000, loss=0.038356
>2/5 param=4.000000, loss=0.037684
>3/5 param=4.000000, loss=0.038813
>4/5 param=4.000000, loss=0.038209
>5/5 param=4.000000, loss=0.036820
1 2 3 4
count 5.000000 5.000000 5.000000 5.000000
mean 0.011890 0.034997 0.019066 0.037976
std 0.000231 0.001767 0.000448 0.000762
min 0.011673 0.032201 0.018394 0.036820
25% 0.011683 0.034430 0.018865 0.037684
50% 0.011844 0.035556 0.019171 0.038209
75% 0.012050 0.036176 0.019422 0.038356
max 0.012200 0.036623 0.019481 0.038813
%load_ext autoreload
%autoreload 2
import models
model,history=models.improved_lstm_model(train_X,train_y,validation_X,validation_y)
Train on 550 samples, validate on 53 samples
Epoch 1/500 - 13s - loss: 0.8709 - val_loss: 0.6445
... (per-epoch output elided: training loss fell steadily from 0.87 to about 0.027, and validation loss from 0.64 to about 0.014, by epoch 338) ... Epoch 339/500 - 0s -
loss: 0.0272 - val_loss: 0.0136 Epoch 340/500 - 0s - loss: 0.0272 - val_loss: 0.0139 Epoch 341/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 342/500 - 0s - loss: 0.0271 - val_loss: 0.0140 Epoch 343/500 - 0s - loss: 0.0271 - val_loss: 0.0134 Epoch 344/500 - 0s - loss: 0.0270 - val_loss: 0.0137 Epoch 345/500 - 0s - loss: 0.0270 - val_loss: 0.0135 Epoch 346/500 - 0s - loss: 0.0269 - val_loss: 0.0139 Epoch 347/500 - 0s - loss: 0.0269 - val_loss: 0.0134 Epoch 348/500 - 0s - loss: 0.0269 - val_loss: 0.0139 Epoch 349/500 - 0s - loss: 0.0269 - val_loss: 0.0133 Epoch 350/500 - 0s - loss: 0.0268 - val_loss: 0.0135 Epoch 351/500 - 0s - loss: 0.0268 - val_loss: 0.0134 Epoch 352/500 - 0s - loss: 0.0267 - val_loss: 0.0136 Epoch 353/500 - 0s - loss: 0.0267 - val_loss: 0.0131 Epoch 354/500 - 0s - loss: 0.0267 - val_loss: 0.0135 Epoch 355/500 - 0s - loss: 0.0267 - val_loss: 0.0131 Epoch 356/500 - 0s - loss: 0.0266 - val_loss: 0.0138 Epoch 357/500 - 0s - loss: 0.0267 - val_loss: 0.0129 Epoch 358/500 - 0s - loss: 0.0266 - val_loss: 0.0137 Epoch 359/500 - 0s - loss: 0.0266 - val_loss: 0.0130 Epoch 360/500 - 0s - loss: 0.0266 - val_loss: 0.0136 Epoch 361/500 - 0s - loss: 0.0266 - val_loss: 0.0130 Epoch 362/500 - 0s - loss: 0.0267 - val_loss: 0.0133 Epoch 363/500 - 0s - loss: 0.0266 - val_loss: 0.0129 Epoch 364/500 - 0s - loss: 0.0266 - val_loss: 0.0129 Epoch 365/500 - 0s - loss: 0.0266 - val_loss: 0.0128 Epoch 366/500 - 0s - loss: 0.0266 - val_loss: 0.0127 Epoch 367/500 - 0s - loss: 0.0265 - val_loss: 0.0127 Epoch 368/500 - 0s - loss: 0.0265 - val_loss: 0.0125 Epoch 369/500 - 0s - loss: 0.0264 - val_loss: 0.0126 Epoch 370/500 - 0s - loss: 0.0264 - val_loss: 0.0125 Epoch 371/500 - 0s - loss: 0.0264 - val_loss: 0.0126 Epoch 372/500 - 0s - loss: 0.0263 - val_loss: 0.0125 Epoch 373/500 - 0s - loss: 0.0262 - val_loss: 0.0126 Epoch 374/500 - 0s - loss: 0.0262 - val_loss: 0.0126 Epoch 375/500 - 0s - loss: 0.0262 - val_loss: 0.0127 Epoch 376/500 - 0s - loss: 0.0262 - val_loss: 0.0127 Epoch 
377/500 - 0s - loss: 0.0261 - val_loss: 0.0128 Epoch 378/500 - 0s - loss: 0.0261 - val_loss: 0.0127 Epoch 379/500 - 0s - loss: 0.0261 - val_loss: 0.0129 Epoch 380/500 - 0s - loss: 0.0261 - val_loss: 0.0128 Epoch 381/500 - 0s - loss: 0.0261 - val_loss: 0.0130 Epoch 382/500 - 0s - loss: 0.0260 - val_loss: 0.0126 Epoch 383/500 - 0s - loss: 0.0260 - val_loss: 0.0128 Epoch 384/500 - 0s - loss: 0.0260 - val_loss: 0.0127 Epoch 385/500 - 0s - loss: 0.0260 - val_loss: 0.0129 Epoch 386/500 - 0s - loss: 0.0260 - val_loss: 0.0127 Epoch 387/500 - 0s - loss: 0.0259 - val_loss: 0.0127 Epoch 388/500 - 0s - loss: 0.0260 - val_loss: 0.0127 Epoch 389/500 - 0s - loss: 0.0260 - val_loss: 0.0126 Epoch 390/500 - 0s - loss: 0.0259 - val_loss: 0.0128 Epoch 391/500 - 0s - loss: 0.0259 - val_loss: 0.0125 Epoch 392/500 - 0s - loss: 0.0259 - val_loss: 0.0129 Epoch 393/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 394/500 - 0s - loss: 0.0259 - val_loss: 0.0127 Epoch 395/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 396/500 - 0s - loss: 0.0259 - val_loss: 0.0130 Epoch 397/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 398/500 - 0s - loss: 0.0258 - val_loss: 0.0127 Epoch 399/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 400/500 - 0s - loss: 0.0258 - val_loss: 0.0129 Epoch 401/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 402/500 - 0s - loss: 0.0258 - val_loss: 0.0126 Epoch 403/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 404/500 - 0s - loss: 0.0258 - val_loss: 0.0127 Epoch 405/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 406/500 - 0s - loss: 0.0258 - val_loss: 0.0125 Epoch 407/500 - 0s - loss: 0.0257 - val_loss: 0.0123 Epoch 408/500 - 0s - loss: 0.0258 - val_loss: 0.0127 Epoch 409/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 410/500 - 0s - loss: 0.0257 - val_loss: 0.0126 Epoch 411/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 412/500 - 0s - loss: 0.0257 - val_loss: 0.0127 Epoch 413/500 - 0s - loss: 0.0258 - val_loss: 0.0121 Epoch 414/500 - 0s - loss: 0.0257 - 
val_loss: 0.0125 Epoch 415/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 416/500 - 0s - loss: 0.0257 - val_loss: 0.0126 Epoch 417/500 - 0s - loss: 0.0257 - val_loss: 0.0123 Epoch 418/500 - 0s - loss: 0.0257 - val_loss: 0.0124 Epoch 419/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 420/500 - 0s - loss: 0.0257 - val_loss: 0.0124 Epoch 421/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 422/500 - 0s - loss: 0.0257 - val_loss: 0.0126 Epoch 423/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 424/500 - 0s - loss: 0.0257 - val_loss: 0.0126 Epoch 425/500 - 0s - loss: 0.0256 - val_loss: 0.0122 Epoch 426/500 - 0s - loss: 0.0257 - val_loss: 0.0127 Epoch 427/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 428/500 - 0s - loss: 0.0257 - val_loss: 0.0125 Epoch 429/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 430/500 - 0s - loss: 0.0257 - val_loss: 0.0126 Epoch 431/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 432/500 - 0s - loss: 0.0257 - val_loss: 0.0127 Epoch 433/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 434/500 - 0s - loss: 0.0256 - val_loss: 0.0125 Epoch 435/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 436/500 - 0s - loss: 0.0256 - val_loss: 0.0123 Epoch 437/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 438/500 - 0s - loss: 0.0256 - val_loss: 0.0124 Epoch 439/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 440/500 - 0s - loss: 0.0256 - val_loss: 0.0124 Epoch 441/500 - 0s - loss: 0.0256 - val_loss: 0.0119 Epoch 442/500 - 0s - loss: 0.0256 - val_loss: 0.0127 Epoch 443/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 444/500 - 0s - loss: 0.0255 - val_loss: 0.0126 Epoch 445/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 446/500 - 0s - loss: 0.0256 - val_loss: 0.0123 Epoch 447/500 - 0s - loss: 0.0256 - val_loss: 0.0119 Epoch 448/500 - 0s - loss: 0.0255 - val_loss: 0.0123 Epoch 449/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 450/500 - 0s - loss: 0.0256 - val_loss: 0.0123 Epoch 451/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 452/500 - 0s - 
loss: 0.0255 - val_loss: 0.0125 Epoch 453/500 - 0s - loss: 0.0255 - val_loss: 0.0121 Epoch 454/500 - 0s - loss: 0.0255 - val_loss: 0.0123 Epoch 455/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 456/500 - 0s - loss: 0.0255 - val_loss: 0.0123 Epoch 457/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 458/500 - 0s - loss: 0.0255 - val_loss: 0.0123 Epoch 459/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 460/500 - 0s - loss: 0.0255 - val_loss: 0.0123 Epoch 461/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 462/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 463/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 464/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 465/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 466/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 467/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 468/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 469/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 470/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 471/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 472/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 473/500 - 0s - loss: 0.0254 - val_loss: 0.0123 Epoch 474/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 475/500 - 0s - loss: 0.0254 - val_loss: 0.0123 Epoch 476/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 477/500 - 0s - loss: 0.0254 - val_loss: 0.0119 Epoch 478/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 479/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 480/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 481/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 482/500 - 0s - loss: 0.0254 - val_loss: 0.0123 Epoch 483/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 484/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 485/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 486/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 487/500 - 0s - loss: 0.0254 - val_loss: 0.0123 Epoch 488/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 489/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 
490/500 - 0s - loss: 0.0254 - val_loss: 0.0123 Epoch 491/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 492/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 493/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 494/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 495/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 496/500 - 0s - loss: 0.0254 - val_loss: 0.0121 Epoch 497/500 - 0s - loss: 0.0254 - val_loss: 0.0120 Epoch 498/500 - 0s - loss: 0.0253 - val_loss: 0.0120 Epoch 499/500 - 0s - loss: 0.0254 - val_loss: 0.0122 Epoch 500/500 - 0s - loss: 0.0254 - val_loss: 0.0120
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(validation_X, validation_y, model, scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload LSTM Model on Validation Data RMSE: 8.958
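The helper `models.make_lstm_prediction` lives in the accompanying `models` module, which is not shown here. Below is a minimal sketch of what such a helper could look like, assuming single-timestep inputs of shape `(samples, 1, features)`, that the target is the first column of the matrix the `MinMaxScaler` was fitted on, and that the model's predictions come back scaled; only the function name and argument order are taken from the calls above, the body is an assumption:

```python
import numpy as np
from math import sqrt

def make_lstm_prediction(X, y, model, scaler):
    """Predict with the fitted model, undo the MinMax scaling,
    and return predictions, targets, and RMSE in price units."""
    yhat = model.predict(X)                        # scaled predictions, shape (n, 1)
    X_flat = X.reshape((X.shape[0], X.shape[2]))   # drop the single timestep axis
    # Rebuild full feature rows so scaler.inverse_transform can be applied;
    # this assumes the target occupies column 0 of the scaled matrix.
    inv_yhat = scaler.inverse_transform(
        np.concatenate((yhat, X_flat[:, 1:]), axis=1))[:, 0]
    inv_y = scaler.inverse_transform(
        np.concatenate((y.reshape(-1, 1), X_flat[:, 1:]), axis=1))[:, 0]
    rmse = sqrt(np.mean((inv_y - inv_yhat) ** 2))
    return inv_yhat, inv_y, rmse
```

The key step is the `inverse_transform`: because the scaler was fitted on the full feature matrix, predictions must be padded back to that width before the scaling can be inverted and an RMSE in cents-per-bushel computed.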
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(test_X, test_y, model, scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload LSTM Model on Test Data RMSE: 9.211
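To judge whether a test RMSE near 9 is meaningful, it helps to compare against a naive persistence forecast (next week's close equals this week's), a standard baseline for time-series models. A minimal sketch, assuming the true prices are available in chronological order (the function name is illustrative, not part of the notebook's `models` module):

```python
import numpy as np
from math import sqrt

def persistence_rmse(prices):
    """RMSE of the naive forecast y_hat[t] = y[t-1]."""
    prices = np.asarray(prices, dtype=float)
    errors = prices[1:] - prices[:-1]   # one-step-ahead persistence errors
    return sqrt(np.mean(errors ** 2))
```

If the LSTM's RMSE does not beat `persistence_rmse(inv_y)` on the same test window, the model has not yet learned anything beyond last week's price.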